Efficient Alternating Least Squares Algorithms for Low Multilinear Rank Approximation of Tensors
Authors
Abstract
The low multilinear rank approximation, also known as the truncated Tucker decomposition, has been extensively utilized in many applications involving higher-order tensors. Popular methods for computing this approximation usually rely directly on the matrix SVD, and therefore often suffer from the notorious intermediate data explosion issue and are not easy to parallelize, especially when the input tensor is large. In this paper, we propose a new class of truncated HOSVD algorithms based on alternating least squares (ALS) for efficiently computing the approximation. The proposed ALS-based approaches are able to eliminate the redundant computations of the singular vectors of intermediate matrices and are therefore free of data explosion. They are also more flexible, with an adjustable convergence tolerance, and are intrinsically parallelizable on high-performance computers. Theoretical analysis reveals that the ALS iteration is q-linearly convergent within a relatively wide convergence region. Numerical experiments on large-scale tensors, both synthetic and real-world, demonstrate that the proposed methods can substantially reduce the total cost of the original ones and are highly scalable for parallel computing.
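The abstract does not spell out the paper's specific ALS scheme. As an illustrative baseline only, the classical higher-order orthogonal iteration (HOOI), the standard ALS-type algorithm for a rank-(r1, ..., rd) truncated Tucker approximation, can be sketched in NumPy; the function names below are ours, and this sketch omits the efficiency refinements the paper proposes.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=20, tol=1e-10):
    """Higher-order orthogonal iteration (an ALS-type scheme) for a
    rank-(r1,...,rd) truncated Tucker approximation of T."""
    d = T.ndim
    # Initialize with truncated HOSVD: leading left singular vectors
    # of each unfolding.
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :ranks[n]]
         for n in range(d)]
    prev = np.inf
    for _ in range(n_iter):
        for n in range(d):
            # Project T onto all factor subspaces except mode n ...
            Y = T
            for m in range(d):
                if m != n:
                    Y = np.moveaxis(
                        np.tensordot(U[m].T, np.moveaxis(Y, m, 0), axes=1),
                        0, m)
            # ... then refresh U[n] from the dominant left singular vectors.
            U[n] = np.linalg.svd(unfold(Y, n),
                                 full_matrices=False)[0][:, :ranks[n]]
        # Core tensor and a simple stagnation test on its norm.
        G = T
        for m in range(d):
            G = np.moveaxis(
                np.tensordot(U[m].T, np.moveaxis(G, m, 0), axes=1), 0, m)
        if abs(np.linalg.norm(G) - prev) < tol:
            break
        prev = np.linalg.norm(G)
    return G, U
```

The approximation is then `G` multiplied back along each mode by the factors `U[m]`; for a tensor whose multilinear rank already equals `ranks`, the reconstruction is exact.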
Similar Resources
Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors
The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations, with the important exception that best rank-one approximations always exist. The most popular approach to low-rank approximation is the alternating least squares (ALS) method...
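In the rank-one case, the ALS method mentioned above reduces to cyclically contracting the tensor against all but one of the unit factors and renormalizing. A minimal sketch (our own naming, not code from the cited work):

```python
import numpy as np

def rank_one_als(T, n_iter=100, tol=1e-12):
    """ALS for a rank-one approximation T ~ sigma * u1 (outer) ... (outer) ud."""
    d = T.ndim
    # Uniform unit-vector initialization; any non-degenerate start works.
    u = [np.full(s, 1.0 / np.sqrt(s)) for s in T.shape]
    sigma = 0.0
    for _ in range(n_iter):
        sigma_old = sigma
        for n in range(d):
            # Contract T against every factor except the n-th; going from
            # the highest mode down keeps the remaining axis indices valid.
            v = T
            for m in range(d - 1, -1, -1):
                if m != n:
                    v = np.tensordot(v, u[m], axes=([m], [0]))
            sigma = np.linalg.norm(v)
            u[n] = v / sigma
        if abs(sigma - sigma_old) < tol:  # stagnation of the scalar weight
            break
    return sigma, u
```

For an exactly rank-one tensor the sweep converges immediately; convergence behavior for general tensors is precisely what the cited works analyze.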
Structure-Preserving Low Multilinear Rank Approximation of Antisymmetric Tensors
This paper is concerned with low multilinear rank approximations to antisymmetric tensors, that is, multivariate arrays for which the entries change sign when permuting pairs of indices. We show which ranks can be attained by an antisymmetric tensor and discuss the adaption of existing approximation algorithms to preserve antisymmetry, most notably a Jacobi algorithm. Particular attention is pa...
Random Projections for Low Multilinear Rank Tensors
We propose two randomized tensor algorithms for reducing multilinear ranks in the Tucker format. The basis of these randomized algorithms is the randomized SVD of Halko, Martinsson and Tropp [9]. Here we provide randomized versions of the higher-order SVD and higher-order orthogonal iteration. Moreover, we provide sharper probabilistic error bounds for the matrix low-rank approximation....
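Reading the description above, a randomized truncated HOSVD replaces each exact SVD of an unfolding with the Halko–Martinsson–Tropp randomized range finder. A minimal sketch under that reading (function names and the oversampling default are our assumptions):

```python
import numpy as np

def randomized_range(A, r, oversample=5, rng=None):
    """Randomized range finder: an orthonormal Q with A ~ Q @ Q.T @ A.
    Truncating Q to r columns is a simplification of the full HMT scheme."""
    rng = np.random.default_rng(rng)
    Omega = rng.standard_normal((A.shape[1], r + oversample))
    Q, _ = np.linalg.qr(A @ Omega)
    return Q[:, :r]

def randomized_hosvd(T, ranks, rng=None):
    """Randomized truncated HOSVD: a randomized range finder per
    mode-n unfolding, then the projected core tensor."""
    d = T.ndim
    U = [randomized_range(np.moveaxis(T, n, 0).reshape(T.shape[n], -1),
                          ranks[n], rng=rng)
         for n in range(d)]
    G = T
    for m in range(d):
        G = np.moveaxis(
            np.tensordot(U[m].T, np.moveaxis(G, m, 0), axes=1), 0, m)
    return G, U
```

When the unfoldings have exact rank at most `ranks[n]`, the random sketch captures their ranges almost surely and the reconstruction is exact; the probabilistic error bounds mentioned above quantify the general case.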
Multilinear Low-Rank Tensors on Graphs & Applications
We propose a new framework for the analysis of low-rank tensors which lies at the intersection of spectral graph theory and signal processing. As a first step, we present a new graph-based low-rank decomposition which approximates the classical low-rank SVD for matrices and multilinear SVD for tensors. Then, building on this novel decomposition we construct a general class of convex optimization...
On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task even to this date. A slightly more manageable endeavor has been to find a low rank approximation in place of the decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...
Journal
Journal Title: Journal of Scientific Computing
Year: 2021
ISSN: 1573-7691, 0885-7474
DOI: https://doi.org/10.1007/s10915-021-01493-0